Data Synchronization

Data synchronization is the process of establishing consistency between the data in a source (OIPA, via the ODS Staging Schema) and the Target database. Synchronization between the OIPA and Target databases is carried out in the following steps:

Step 1:

Move the data from the OIPA transactional database to the ODS Staging Schema through a data pump or data dump, using IMPORT/EXPORT or IMPDP/EXPDP.

The DBA creates a complete dump of the initial data from the OIPA database and moves the schema and data to the ODS Staging Schema. On completion of Step 1, the ODS Staging Schema contains a copy of the OIPA data.
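As an illustrative sketch, the initial load in Step 1 can be performed with the Oracle Data Pump command-line utilities. The schema name OIPA, the staging schema name ODS_STAGING, the directory object DATA_PUMP_DIR, the connect strings, and the file names below are assumptions; actual values depend on the installation.

Export from the OIPA source database:

    expdp system@<source_db> schemas=OIPA directory=DATA_PUMP_DIR dumpfile=oipa_init.dmp logfile=oipa_exp.log

Import into the ODS Staging Schema, remapping the schema name:

    impdp system@<staging_db> remap_schema=OIPA:ODS_STAGING directory=DATA_PUMP_DIR dumpfile=oipa_init.dmp logfile=oipa_imp.log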

Step 2:

Replicate OIPA data (instance data) to the Staging database via a data replication tool (e.g., Golden Gate).
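As an illustrative sketch only, a Golden Gate setup for this step typically pairs an Extract process on the OIPA source with a Replicat process on the staging database. The process names, credentials, trail path, and schema names below are assumptions, and a production configuration requires additional parameters.

    -- Extract parameter file on the OIPA source (names assumed)
    EXTRACT extoipa
    USERID ggadmin, PASSWORD <password>
    EXTTRAIL ./dirdat/oi
    TABLE OIPA.*;

    -- Replicat parameter file on the staging database (names assumed)
    REPLICAT repstg
    USERID ggadmin, PASSWORD <password>
    MAP OIPA.*, TARGET ODS_STAGING.*;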

Step 3:

Replicate the instance data from the Staging database into the Target database based on the Job Scheduler frequency set by the user in the ODS utility.

ODS Staging Schema

The ODS Staging Schema includes:

  • All AS* tables - imported from the OIPA data pump.
  • ODS-specific tables - generated via the ODS utility, either executed directly on the staging schema or created manually by the DBA via SQL scripts.
  • DSViews (used for pivoting) - generated via the ODS utility, either executed directly on the staging schema or created manually by the DBA via SQL scripts.
  • Stored procedures - used for initial and incremental data movement from the AS* tables in the staging schema to the DS* tables in the target schema.

Note: The stored procedures must be created manually by the DBA via SQL scripts.

  • Scheduled Jobs - will be used to execute database stored procedures.

Note: Jobs are created via the ODS UI, using the database's job scheduler API. Each supported database provides a native job scheduler that can create jobs to execute native database objects (a sketch follows this list).

  • Report/Log tables for job status - only technical errors are logged, for debugging purposes.
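On Oracle, the job scheduler API referenced in the note above is DBMS_SCHEDULER. A minimal sketch of a job that runs one of the stored procedures once a day follows; the job name, procedure name, and schedule here are assumptions and would normally be generated by the ODS UI.

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB (
        job_name        => 'ODS_POLICY_SYNC_JOB',          -- assumed job name
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'ODS_STAGING.SP_MOVE_ASPOLICY', -- assumed procedure name
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY; BYHOUR=1',          -- frequency set via the ODS UI
        enabled         => TRUE);
    END;
    /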

ODS Target Schema

The ODS Target Schema includes:

  • All DS* tables - created by ODS, either directly via the utility or manually by the DBA through SQL scripts.

Note: Once the schema has been created to satisfaction, the DBA can revoke the ODS user's privileges to create, alter, and drop tables and views (see the sketch below).
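A minimal sketch of that revocation, assuming the ODS account is named ODS_USER and was granted these privileges directly during setup (both assumptions):

    -- Remove the ability to create new tables and views (assumed grants)
    REVOKE CREATE TABLE, CREATE VIEW FROM ods_user;
    -- If the account was granted the DBA role during setup, revoke it as well
    REVOKE DBA FROM ods_user;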

DB Job Execution

This is an atomic operation executed in the Staging database and triggered by the ODS job scheduler. The DB jobs are executed according to the schedule configured by the user in the ODS UI.

The database job can run once a day, or more than once a day, depending on the frequency set by the user. On completion of the execution, either all records in the batch succeed and are committed, or all fail and are rolled back, leaving the database in a consistent state.

Job execution in the Staging database takes place as follows:

  1. Each time a job is executed, the database runs the stored procedure, which internally INSERTs data into the DS tables using the pivoted views. For as-is tables, a direct SELECT and INSERT is used.
  2. Golden Gate uses the transaction logs from the OIPA source database to determine the updated/inserted records to pass to staging. These in turn are captured by ODS and applied through the insert/update scripts based on the timestamp of the last successfully completed ODS update job.

  3. Once a scheduled job completes, the stored procedure logs the details of the run, such as:
    • Job Id
    • Job name
    • Job Status (Success)
    • Source table name (Ex: AsPolicy)
    • Target table name (Ex: DSPolicy)
    • Job Execution Date and Time - DateTimestamp
    • Record Count - (1000 | 100)
    • Error Message - Failure DB message, in case of error

When the stored procedure moves the incremental data from the AS* tables to the DS* tables, each successful commit or rollback records the above details.
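A minimal sketch of such a stored procedure is shown below, assuming a pivoted view DSVIEW_POLICY, a target table DSPOLICY reachable from the staging schema, a change-tracking column UPDATED_GMT, and a log table ODS_JOB_LOG with the columns listed above; every object and column name here is an assumption for illustration.

    CREATE OR REPLACE PROCEDURE sp_move_aspolicy (p_last_run IN TIMESTAMP) AS
      v_count NUMBER := 0;
    BEGIN
      -- Incremental load: pick up rows changed since the last successful run
      INSERT INTO dspolicy
        SELECT * FROM dsview_policy
         WHERE updated_gmt > p_last_run;
      v_count := SQL%ROWCOUNT;

      -- Record the run details, then commit the whole batch atomically
      INSERT INTO ods_job_log (job_name, job_status, source_table, target_table,
                               execution_ts, record_count)
      VALUES ('ODS_POLICY_SYNC_JOB', 'Success', 'AsPolicy', 'DSPolicy',
              SYSTIMESTAMP, v_count);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        -- Any failure rolls back the batch and logs the database error message
        ROLLBACK;
        INSERT INTO ods_job_log (job_name, job_status, source_table, target_table,
                                 execution_ts, record_count, error_message)
        VALUES ('ODS_POLICY_SYNC_JOB', 'Failure', 'AsPolicy', 'DSPolicy',
                SYSTIMESTAMP, 0, SQLERRM);
        COMMIT;
    END;
    /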

Note: For the AS* table data in the ODS Staging Schema, the user can configure the period for which the data remains available for access; once that period elapses, the data is purged (a sketch follows).
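A sketch of that purge, assuming a retention period of 90 days and an UPDATED_GMT column on the staged table (both assumptions):

    -- Remove staged AsPolicy rows older than the configured retention period
    DELETE FROM aspolicy
     WHERE updated_gmt < SYSTIMESTAMP - INTERVAL '90' DAY;
    COMMIT;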
